Variance-Reduced Second-Order Methods

Authors

  • Luo Luo
  • Zihao Chen
  • Zhihua Zhang
  • Wu-Jun Li
Abstract

In this paper, we discuss the problem of minimizing the sum of two convex functions: a smooth function plus a non-smooth function. Further, the smooth part can be expressed as the average of a large number of smooth component functions, and the non-smooth part is equipped with a simple proximal mapping. We propose a proximal stochastic second-order method, which is efficient and scalable. It incorporates the Hessian of the smooth part of the objective and exploits a multistage scheme to reduce the variance of the stochastic gradient. We prove that our method achieves a linear rate of convergence.
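As a rough illustration of the recipe the abstract describes, the Python sketch below combines a Prox-SVRG-style multistage loop with second-order information: each stage recomputes the full gradient at a snapshot, the inner loop forms a variance-reduced stochastic gradient, and a diagonal approximation of the smooth part's Hessian scales both the step and the proximal mapping. The lasso objective, the diagonal preconditioner, and all names and parameters here are illustrative assumptions; the paper's actual method may solve a different scaled proximal subproblem.

```python
import numpy as np

def soft_threshold(z, t):
    """Coordinate-wise soft-thresholding: prox of t * ||.||_1 (t may be a vector)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_svrg_second_order(A, b, lam, eta=0.1, n_stages=20, m=None, seed=0):
    """
    Hedged sketch (not the paper's exact algorithm) of a proximal SVRG-style
    method with a diagonal second-order preconditioner, for the lasso problem
        min_x (1/2n)||Ax - b||^2 + lam * ||x||_1.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = m or 2 * n                        # inner-loop length per stage
    D = np.sum(A * A, axis=0) / n         # diag of the smooth part's Hessian (1/n) A^T A
    D = np.maximum(D, 1e-8)               # guard against zero curvature
    x_tilde = np.zeros(d)                 # stage snapshot
    for _ in range(n_stages):
        mu = A.T @ (A @ x_tilde - b) / n  # full gradient at the snapshot
        x = x_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            # variance-reduced stochastic gradient (SVRG control variate)
            g = A[i] * (A[i] @ x - b[i]) - A[i] * (A[i] @ x_tilde - b[i]) + mu
            # preconditioned step followed by the matching scaled proximal mapping
            x = soft_threshold(x - eta * g / D, lam * eta / D)
        x_tilde = x                       # multistage update of the snapshot
    return x_tilde
```

One convenient property of the diagonal metric used here: the scaled proximal mapping of the l1 norm stays coordinate-wise soft-thresholding, so the second-order scaling adds essentially no per-iteration cost.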


Similar articles

Stochastic Variance-Reduced Cubic Regularized Newton Method

We propose a stochastic variance-reduced cubic regularized Newton method for non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient along with a semi-stochastic Hessian, which are specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an $(\epsilon, \sqrt{\epsilon})$-approximate local minimum within $\tilde{O}(n^{4/5}/\epsilon^{3/2})$ second-order oracle ...
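A minimal sketch of the construction this abstract outlines, under assumed interfaces: mini-batch gradient and Hessian estimates are corrected by full quantities computed at a stage snapshot (the "semi-stochastic" estimators), and each step approximately solves a cubic-regularized model. The callables `grad_i`/`hess_i`, the batch sizes, and the naive gradient-descent subproblem solver are all hypothetical choices, not the paper's algorithm.

```python
import numpy as np

def cubic_subproblem(g, H, M, n_iters=100, lr=0.01):
    """Approximately minimize g'h + 0.5 h'Hh + (M/6)||h||^3 by (naive, fixed-step)
    gradient descent; the gradient of the cubic term is 0.5*M*||h||*h."""
    h = np.zeros_like(g)
    for _ in range(n_iters):
        h -= lr * (g + H @ h + 0.5 * M * np.linalg.norm(h) * h)
    return h

def svrc_sketch(grad_i, hess_i, n, d, M=10.0, n_stages=10, m=5, bg=8, bh=8, seed=0):
    """
    Hedged sketch of a stochastic variance-reduced cubic-regularized Newton loop.
    grad_i(i, x) / hess_i(i, x) return the gradient / Hessian of the i-th
    component (hypothetical callables; assumes n >= bg, bh).
    """
    rng = np.random.default_rng(seed)
    x_snap = np.zeros(d)
    for _ in range(n_stages):
        g_full = np.mean([grad_i(i, x_snap) for i in range(n)], axis=0)
        H_full = np.mean([hess_i(i, x_snap) for i in range(n)], axis=0)
        x = x_snap.copy()
        for _ in range(m):
            S = rng.choice(n, bg, replace=False)
            B = rng.choice(n, bh, replace=False)
            # semi-stochastic gradient: mini-batch estimate corrected at the snapshot
            v = np.mean([grad_i(i, x) - grad_i(i, x_snap) for i in S], axis=0) + g_full
            # semi-stochastic Hessian, corrected the same way
            U = np.mean([hess_i(i, x) - hess_i(i, x_snap) for i in B], axis=0) + H_full
            x = x + cubic_subproblem(v, U, M)
        x_snap = x
    return x_snap
```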


CVaR Reduced Fuzzy Variables and Their Second Order Moments

Based on credibilistic value-at-risk (CVaR) of regular fuzzy variables, we introduce a new CVaR reduction method for type-2 fuzzy variables. The reduced fuzzy variables are characterized by parametric possibility distributions. We establish some useful analytical expressions for mean values and second-order moments of common reduced fuzzy variables. The convex properties of second-order moments with ...


Accelerating SVRG via second-order information

We consider the problem of minimizing an objective function that is a sum of convex functions. For large sums, batch methods suffer from a prohibitive per-iteration complexity, and are outperformed by incremental methods such as the recent variance-reduced stochastic gradient methods (e.g. SVRG). In this paper, we propose to improve the performance of SVRG by incorporating approximate curvature ...
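One way to "incorporate approximate curvature information" into SVRG, sketched below under assumed interfaces: the control variate is sharpened by extrapolating the snapshot gradients to the current point with snapshot Hessians, so the correction tracks the component gradient to first order. `grad_i`/`hess_i` and the parameter choices are hypothetical; the paper may use cheaper curvature approximations than exact per-component Hessians.

```python
import numpy as np

def svrg_curvature_sketch(grad_i, hess_i, n, d, eta=0.05, n_stages=15, m=None, seed=0):
    """
    Hedged sketch of SVRG with a curvature-corrected control variate.
    grad_i(i, x) / hess_i(i, x) are hypothetical callables returning the
    gradient / Hessian of the i-th component function.
    """
    rng = np.random.default_rng(seed)
    m = m or 2 * n
    x_snap = np.zeros(d)
    for _ in range(n_stages):
        mu = np.mean([grad_i(i, x_snap) for i in range(n)], axis=0)  # full gradient
        H = np.mean([hess_i(i, x_snap) for i in range(n)], axis=0)   # full Hessian
        x = x_snap.copy()
        for _ in range(m):
            i = rng.integers(n)
            dx = x - x_snap
            # first-order Taylor models of the component and full gradients,
            # both expanded around the snapshot
            g_i_model = grad_i(i, x_snap) + hess_i(i, x_snap) @ dx
            g_model = mu + H @ dx
            # curvature-corrected variance-reduced gradient
            v = grad_i(i, x) - g_i_model + g_model
            x -= eta * v
        x_snap = x
    return x_snap
```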


Sample Complexity of Stochastic Variance-Reduced Cubic Regularization for Nonconvex Optimization

The popular cubic regularization (CR) method converges with first- and second-order optimality guarantees for nonconvex optimization, but encounters a high sample complexity issue for solving large-scale problems. Various sub-sampling variants of CR have been proposed to improve the sample complexity. In this paper, we propose a stochastic variance-reduced cubic-regularized (SVRC) Newton's method ...


Parametric bootstrap methods for bias correction in linear mixed models

The empirical best linear unbiased predictor (EBLUP) in the linear mixed model (LMM) is useful for small area estimation, and the estimation of the mean squared error (MSE) of EBLUP is important as a measure of the uncertainty of EBLUP. To obtain a second-order unbiased estimator of the MSE, the second-order bias correction has been derived, mainly based on Taylor series expansions. However, this ...
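For intuition only, here is a generic parametric-bootstrap bias correction in Python, applied to a toy problem (the variance MLE under a normal model) rather than to the EBLUP/MSE setting of the paper, which would require a fitted LMM: the bias of an estimator is approximated by re-estimating on datasets simulated from the fitted model, then subtracted. All names here are illustrative.

```python
import numpy as np

def parametric_bootstrap_bias_correction(y, estimator, simulate, B=1000, seed=0):
    """
    Generic sketch: estimate the bias of `estimator` by re-estimating on B
    datasets drawn from the model fitted to y, then subtract that bias.
    """
    rng = np.random.default_rng(seed)
    theta_hat = estimator(y)
    boot = np.array([estimator(simulate(theta_hat, len(y), rng)) for _ in range(B)])
    bias = boot.mean() - theta_hat          # Monte Carlo estimate of the bias
    return theta_hat - bias                 # bias-corrected estimate

# Toy usage: correct the downward bias of the variance MLE under a normal model.
# (The mean is fixed at 0 in the simulator; the variance MLE below is
# location-invariant, so this does not affect the bias estimate.)
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = rng.normal(0.0, 2.0, size=30)
    mle_var = lambda z: np.mean((z - z.mean()) ** 2)          # biased MLE of variance
    simulate = lambda v, n, r: r.normal(0.0, np.sqrt(v), n)   # parametric resample
    print(parametric_bootstrap_bias_correction(y, mle_var, simulate))
```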



Journal:
  • CoRR

Volume: abs/1602.00223

Publication date: 2016